Statistical dimension






On the role of entanglement and statistics in learning (Supplementary material)

Neural Information Processing Systems

Note that this is defined up to an absolute phase. Learning models: in this section we first describe the learning models we will be concerned with in this paper. Such quantum examples have been investigated in prior works [6, 8, 9]. A natural way to extend the learning model is to allow the algorithm quantum statistical queries. The QSQ model allows a quantum advantage in learning in this framework.
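The quantum statistical query model extends the classical statistical query (SQ) oracle, in which a learner receives noisy estimates of expectations rather than individual examples. A minimal sketch of the classical analogue, with illustrative function and parameter names not taken from the paper:

```python
import numpy as np

def sq_oracle(phi, examples, labels, tau, rng=None):
    """Simulated SQ oracle: return E[phi(x, f(x))] up to tolerance tau.

    Here the expectation is estimated by averaging over samples; any
    answer within tau of the true expectation is a valid oracle response,
    which we model by adding bounded uniform noise.
    """
    rng = np.random.default_rng(rng)
    estimate = np.mean([phi(x, y) for x, y in zip(examples, labels)])
    return estimate + rng.uniform(-tau, tau)

# Example query: correlation of the first input bit with the label of a
# parity function f(x) = x[0] XOR x[1] on uniform 3-bit inputs. The true
# correlation is 0, since (-1)^(x[0] XOR f(x)) = (-1)^(x[1]).
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(10000, 3))
y = X[:, 0] ^ X[:, 1]
corr = sq_oracle(lambda x, fx: (-1) ** (x[0] ^ fx), X, y, tau=0.01, rng=1)
```

In the quantum (QSQ) setting, the queried quantity is instead the expectation of an observable on copies of a quantum example state, but the tolerance-based access pattern is analogous.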



Export Reviews, Discussions, Author Feedback and Meta-Reviews

Neural Information Processing Systems

First provide a summary of the paper, and then address the following criteria: quality, clarity, originality and significance. This paper gives a new convex relaxation for sparse matrix factorization. The authors show that the new relaxation has better denoising performance and a lower statistical dimension than the traditional l_1 or trace norms. However, the new convex relaxation is hard to compute. The paper also discusses many potential applications, with experiments on denoising performance and an application to predicting drug-target interactions.



Tight convex relaxations for sparse matrix factorization

Neural Information Processing Systems

Based on a new atomic norm, we propose a new convex formulation for sparse matrix factorization problems in which the number of nonzero elements of the factors is assumed fixed and known. The formulation counts sparse PCA with multiple factors, subspace clustering and low-rank sparse bilinear regression among its potential applications. We compute slow rates and an upper bound on the statistical dimension of the proposed norm for rank-1 matrices, showing that its statistical dimension is an order of magnitude smaller than those of the usual l_1-norm, trace norm and their combinations. Even though our convex formulation is in theory hard and does not lead to provably polynomial-time algorithmic schemes, we propose an active-set algorithm that leverages the structure of the convex problem to solve it, and we show promising numerical results.
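The l_1-norm and trace-norm relaxations that the abstract compares against both admit closed-form proximal operators, which is what makes them the standard denoising baselines. A minimal sketch of those two baselines on a noisy rank-1 sparse matrix (the example data and regularization values are illustrative, not from the paper):

```python
import numpy as np

def soft_threshold(M, lam):
    """Proximal operator of the elementwise l1-norm: entrywise shrinkage."""
    return np.sign(M) * np.maximum(np.abs(M) - lam, 0.0)

def svd_threshold(M, lam):
    """Proximal operator of the trace (nuclear) norm: singular-value shrinkage."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

# Denoise a rank-1 matrix with sparse factors observed under Gaussian noise.
rng = np.random.default_rng(0)
u = np.zeros(20); u[:3] = 1.0   # sparse left factor (3 nonzeros)
v = np.zeros(20); v[:3] = 1.0   # sparse right factor (3 nonzeros)
M = np.outer(u, v)
noisy = M + 0.1 * rng.standard_normal(M.shape)
denoised_l1 = soft_threshold(noisy, 0.2)   # exploits entrywise sparsity
denoised_tr = svd_threshold(noisy, 1.0)    # exploits low rank
```

The atomic norm proposed in the paper is designed to exploit both structures at once, which is why its statistical dimension can be smaller than that of either baseline or their combinations; unlike these baselines, however, it lacks an efficiently computable proximal operator, motivating the active-set algorithm.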